A Novel Faster RCNN with ODN-Based Rain Removal Technique

Authors

Abstract

During rainy times, the performance of outdoor vision systems decreases considerably owing to the visibility barrier, distortion, and blurring instigated by raindrops. It is therefore essential to eradicate raindrops from images to ensure the reliability of such systems. To achieve this, several rain removal studies have been performed in recent years. In this view, this paper presents a new Faster Region Convolutional Neural Network (Faster RCNN) with Optimal Densely Connected Networks (DenseNet)-based rain removal technique called FRCNN-ODN. The presented technique applies weighted mean filtering (WMF) as a denoising step, which helps boost the quality of the input image. In addition, the Faster RCNN technique is used for raindrop detection and comprises a region proposal network (RPN) and a Fast RCNN model. The RPN generates high-quality region proposals that are exploited to detect the raindrops. Also, the DenseNet model is utilized as the baseline network to generate the feature map. Moreover, the sparrow search optimization algorithm (SSOA) is applied to choose the hyperparameters, namely the learning rate, batch size, momentum, and weight decay. An extensive experimental validation process is carried out to highlight the effectual outcome of the FRCNN-ODN model, and the results are investigated with respect to several dimensions. The method produced a higher UIQI of 0.981 on image 1. Furthermore, on image 2 it achieved a maximum UIQI of 0.982, and a UIQI of 0.998 on image 3. The simulation results showcased the superior performance of the FRCNN-ODN (Faster RCNN with Optimal Densely Connected Networks) technique over existing methods in terms of distinct measures.
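To make the pipeline above concrete, the sketch below wires a Faster RCNN detector onto a DenseNet feature extractor and includes the global Universal Image Quality Index (UIQI) used to score restored images. This is a minimal illustration, not the authors' implementation: the DenseNet-121 variant, the use of torchvision, the anchor settings, and the SGD values standing in for the SSOA-tuned hyperparameters are all assumptions.

```python
# Minimal sketch (not the authors' released code) of the FRCNN-ODN building blocks:
# a Faster RCNN detector with a DenseNet feature extractor, SGD hyperparameters
# standing in for the SSOA-tuned values, and the UIQI score for restored images.
import torch
import torchvision
from torchvision.models.detection import FasterRCNN
from torchvision.models.detection.rpn import AnchorGenerator

# DenseNet backbone producing the shared feature map consumed by the RPN.
# DenseNet-121 is an assumed variant; the paper does not fix it here.
backbone = torchvision.models.densenet121(weights=None).features
backbone.out_channels = 1024  # channel count of densenet121's final feature map

# RPN anchors and RoI pooling; sizes/ratios are generic defaults, not paper values.
anchor_generator = AnchorGenerator(sizes=((32, 64, 128, 256),),
                                   aspect_ratios=((0.5, 1.0, 2.0),))
roi_pooler = torchvision.ops.MultiScaleRoIAlign(featmap_names=["0"],
                                                output_size=7,
                                                sampling_ratio=2)

# Two classes: background vs. raindrop.
model = FasterRCNN(backbone,
                   num_classes=2,
                   rpn_anchor_generator=anchor_generator,
                   box_roi_pool=roi_pooler)

# Hyperparameters the SSOA would search over (learning rate, batch size,
# momentum, weight decay); the values below are placeholders.
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3,
                            momentum=0.9, weight_decay=5e-4)

def uiqi(x: torch.Tensor, y: torch.Tensor) -> float:
    """Global Universal Image Quality Index between reference x and test y
    (the original metric averages this quantity over sliding windows)."""
    x, y = x.float().flatten(), y.float().flatten()
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(unbiased=False), y.var(unbiased=False)
    cov = ((x - mx) * (y - my)).mean()
    return float(4 * cov * mx * my / ((vx + vy) * (mx**2 + my**2)))
```

In the described pipeline, weighted mean filtering would be applied to the input frame before detection, and the UIQI values quoted in the abstract (0.981, 0.982, 0.998) rate the restored images against their references.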


Similar Articles

Revisiting RCNN: On Awakening the Classification Power of Faster RCNN

Recent region-based object detectors are usually built with separate classification and localization branches on top of shared feature extraction networks. In this paper, we analyze failure cases of state-of-the-art detectors and observe that most hard false positives result from classification instead of localization. We conjecture that: (1) Shared feature representation is not optimal due to t...


An Implementation of Faster RCNN with Study for Region Sampling

We adapted the joint-training scheme of the Faster RCNN framework from Caffe to TensorFlow as a baseline implementation for object detection. Our code is made publicly available. This report documents the simplifications made to the original pipeline, with justifications from ablation analysis on both PASCAL VOC 2007 and COCO 2014. We further investigated the role of non-maximal suppression (NMS) in...
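That report's ablations center on region sampling and the role of NMS in it; as a reference point, a minimal sketch of standard greedy non-maximal suppression over scored proposal boxes is shown below. The box layout and the 0.7 IoU threshold are illustrative assumptions, not values taken from the report.

```python
# Greedy non-maximal suppression (NMS) as commonly used to prune overlapping
# region proposals in Faster RCNN; the 0.7 threshold is an assumed default.
import numpy as np

def nms(boxes: np.ndarray, scores: np.ndarray, iou_threshold: float = 0.7) -> list:
    """boxes: (N, 4) array of [x1, y1, x2, y2]; returns indices of kept boxes."""
    x1, y1, x2, y2 = boxes.T
    areas = (x2 - x1) * (y2 - y1)
    order = scores.argsort()[::-1]          # highest-scoring boxes first
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        # Intersection of the chosen box with all remaining boxes
        xx1 = np.maximum(x1[i], x1[order[1:]])
        yy1 = np.maximum(y1[i], y1[order[1:]])
        xx2 = np.minimum(x2[i], x2[order[1:]])
        yy2 = np.minimum(y2[i], y2[order[1:]])
        inter = np.maximum(0.0, xx2 - xx1) * np.maximum(0.0, yy2 - yy1)
        iou = inter / (areas[i] + areas[order[1:]] - inter)
        # Keep only boxes that overlap the chosen box less than the threshold
        order = order[1:][iou < iou_threshold]
    return keep
```

Calling `nms(boxes, scores)` returns the indices of the retained proposals in descending score order; raising or lowering the threshold trades recall of overlapping objects against duplicate detections.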


Face Detection Using Improved Faster RCNN

Faster RCNN has achieved great success for generic object detection including PASCAL object detection and MS COCO object detection. In this report, we propose a detailed designed Faster RCNN method named FDNet1.0 for face detection. Several techniques were employed including multi-scale training, multi-scale testing, light-designed RCNN, some tricks for inference and a vote-based ensemble metho...


Face Detection using Deep Learning: An Improved Faster RCNN Approach

In this report, we present a new face detection scheme using deep learning and achieve the state-of-the-art detection performance on the well-known FDDB face detection benchmark evaluation. In particular, we improve the state-of-the-art faster RCNN framework by combining a number of strategies, including feature concatenation, hard negative mining, multi-scale training, model pretraining, and pr...


Multiple Object Tracking Based on Faster-RCNN Detector and KCF Tracker

Tracking and detection of objects is one of the most popular topics recently, used for motion detection of various objects in a given video or images. To achieve intelligent navigation of a moving platform operating on the sidewalk, our goal is to build software that is able to detect pedestrians and predict their trajectories, during which process MOT (multiple object ...



Journal

Journal title: Mathematical Problems in Engineering

Year: 2022

ISSN: 1026-7077, 1563-5147, 1024-123X

DOI: https://doi.org/10.1155/2022/4546135